
Apple accused of not fully reporting suspected CSAM on its platforms

Apple is facing allegations of downplaying the extent of child sexual abuse material (CSAM) on its platforms. According to the National Society for the Prevention of Cruelty to Children (NSPCC), a UK-based child protection charity, Apple only reported 267 suspected CSAM cases to the National Center for Missing & Exploited Children (NCMEC) globally last year.

That pales in comparison to the 1.47 million possible cases reported by Google and the 30.6 million reported by Meta. Other platforms that reported more potential CSAM incidents than Apple in 2023 include TikTok (590,376), X (597,087), Snapchat (713,055), Xbox (1,537) and PlayStation/Sony Interactive Entertainment (3,974). All US-based technology companies are required to report any possible CSAM they detect on their platforms to NCMEC, which then refers cases to relevant law enforcement agencies around the world.

The NSPCC also said Apple was implicated in more CSAM cases (337) in England and Wales between April 2022 and March 2023 than it reported worldwide in an entire year. The charity gathered that data through freedom of information requests to police forces.

As The Guardian, which first reported the NSPCC's claim, points out, Apple services such as iMessage, FaceTime and iCloud all use end-to-end encryption, which prevents the company from viewing the content users share on them. However, WhatsApp also uses E2EE, and that service reported nearly 1.4 million suspected CSAM cases to NCMEC in 2023.

“There is a worrying discrepancy between the number of child abuse image crimes on Apple services in the UK and the number of reports they make to authorities worldwide,” said Richard Collard, head of online policy for child safety at the NSPCC. “Apple is clearly lagging behind many of its peers in tackling child sexual abuse, when all tech companies should be investing in security and preparing for the introduction of the Online Safety Act in the UK.”

In 2021, Apple announced plans for a system that would scan images before they were uploaded to iCloud and compare them against a database of known CSAM imagery from NCMEC and other organizations. But following a backlash from privacy and digital rights advocates, Apple delayed the rollout of its CSAM detection tools before ultimately ending the project in 2022.
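For context on how this kind of matching works in general, here is a minimal, purely illustrative Python sketch. It is not Apple's design: the shelved proposal relied on a perceptual hash (NeuralHash) and privacy-preserving on-device matching, whereas this toy version uses an ordinary SHA-256 digest, a hypothetical KNOWN_HASHES set and made-up function names simply to show the idea of comparing an image fingerprint against a list of known material before upload.

    import hashlib
    from pathlib import Path

    # Hypothetical fingerprint set; in the real proposal these would have been
    # perceptual hashes supplied by NCMEC and other child-safety organizations.
    KNOWN_HASHES = {
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def fingerprint(image_path: Path) -> str:
        # Stand-in for a perceptual hash: digest the raw image bytes.
        return hashlib.sha256(image_path.read_bytes()).hexdigest()

    def matches_known_material(image_path: Path) -> bool:
        # Flag the image if its fingerprint appears in the known-hash set,
        # e.g. as a check run before the file is uploaded.
        return fingerprint(image_path) in KNOWN_HASHES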

Apple declined to comment on the NSPCC's allegation, instead pointing The Guardian to a statement it issued when it abandoned the CSAM scanning plan. Apple said it had chosen a different strategy that “prioritizes [its] users’ security and privacy.” The company told Wired in August 2022 that “children can be protected without companies combing through personal data.”
